Optimal stochastic control and Hamilton-Jacobi-Bellman equations


Similar papers

Optimal Soaring with Hamilton-Jacobi-Bellman Equations∗

Competition glider flying, like other outdoor sports, is a game of stochastic optimization, in which mathematics and quantitative strategies have historically played an important role. We address the problem of uncertain future atmospheric conditions by constructing a nonlinear Hamilton-Jacobi-Bellman equation for the optimal speed to fly, with a free boundary describing the climb/cruise decisi...


Stochastic Equations with Delay: Optimal Control via BSDEs and Regular Solutions of Hamilton--Jacobi--Bellman Equations

We consider an Itô stochastic differential equation with delay, driven by Brownian motion, whose solution, by an appropriate reformulation, defines a Markov process X with values in a space of continuous functions C, with generator L. We then consider a backward stochastic differential equation depending on X, with unknown processes (Y, Z), and we study properties of the resulting system, in p...
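For orientation, the Markovian BSDE referred to in this abstract has, in standard notation, the following generic form (the driver f, terminal function g, and horizon T are conventional symbols, not taken from the truncated text):

```latex
Y_t \;=\; g(X_T) \;+\; \int_t^T f\big(s, X_s, Y_s, Z_s\big)\,\mathrm{d}s \;-\; \int_t^T Z_s\,\mathrm{d}W_s,
\qquad t \in [0, T],
```

where the pair (Y, Z) is adapted to the Brownian filtration; under suitable regularity one has Y_t = v(t, X_t), with v a regular solution of the associated Hamilton-Jacobi-Bellman equation, which is the link the paper develops.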


Hamilton-Jacobi-Bellman Equations

This work treats Hamilton-Jacobi-Bellman equations. Their relation to several problems in mathematics is presented and an introduction to viscosity solutions is given. The work of several research articles is reviewed, including the Barles-Souganidis convergence argument and the inaugural papers on mean-field games. Original research on numerical methods for Hamilton-Jacobi-Bellman equations is...


Hamilton-Jacobi-Bellman equations for Quantum Optimal Feedback Control

We exploit the separation of the filtering and control aspects of quantum feedback control to consider the optimal control as a classical stochastic problem on the space of quantum states. We derive the corresponding Hamilton-Jacobi-Bellman equations using the elementary arguments of classical control theory and show that this is equivalent, in the Stratonovich calculus, to a stochastic Hamilton...


Hamilton-Jacobi-Bellman Equations and the Optimal Control of Stochastic Systems

In many applications (engineering, management, economics) one is led to control problems for stochastic systems: more precisely, the state of the system is assumed to be described by the solution of stochastic differential equations, and the control enters the coefficients of the equation. Using the dynamic programming principle, E. Bellman [6] explained why, at least heuristically, the optimal cos...
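The heuristic dynamic-programming argument alluded to here leads to the standard HJB equation for the value function; in a generic form (the drift b, diffusion σ, running cost f, terminal cost g, and control set A are conventional symbols assumed here, not drawn from the truncated abstract):

```latex
\partial_t V(t,x)
\;+\; \sup_{a \in A} \Big\{\, b(x,a)\cdot \nabla_x V(t,x)
\;+\; \tfrac{1}{2}\operatorname{tr}\!\big(\sigma\sigma^{\!\top}(x,a)\, D_x^2 V(t,x)\big)
\;+\; f(x,a) \,\Big\} \;=\; 0,
\qquad V(T,x) = g(x).
```

The control enters through the coefficients b, σ and the cost f, exactly as the abstract describes; the supremum over a encodes the pointwise optimization of the dynamic programming principle.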



Journal

Journal title: Banach Center Publications

Year: 1985

ISSN: 0137-6934, 1730-6299

DOI: 10.4064/-14-1-313-318